Global Convergence of the Dai-Yuan Conjugate Gradient Method with Perturbations
Authors
Abstract
In this paper, the authors propose a class of Dai-Yuan (abbr. DY) conjugate gradient methods with line search in the presence of perturbations, for general functions and for uniformly convex functions respectively. The iterate formula is x_{k+1} = x_k + α_k(s_k + ω_k), where the main direction s_k is obtained by the DY conjugate gradient method, ω_k is a perturbation term, and the stepsize α_k is determined by a line search and does not necessarily tend to zero in the limit. The authors prove the global convergence of these methods under mild conditions. Preliminary computational experience is also reported.
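As a rough illustration of the iteration described above, the sketch below combines the standard Dai-Yuan direction update with the perturbed iterate x_{k+1} = x_k + α_k(s_k + ω_k). The function name dy_cg_with_perturbations, the perturbation callback, and the backtracking Armijo rule are illustrative assumptions, not the authors' algorithm; the paper's own line search conditions and convergence assumptions are given in the full text.

```python
import numpy as np

def dy_cg_with_perturbations(f, grad, x0, perturbation=None,
                             max_iter=200, tol=1e-6):
    """Sketch of a DY conjugate gradient iteration with an additive
    perturbation omega_k: x_{k+1} = x_k + alpha_k * (s_k + omega_k)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                   # initial (steepest-descent) main direction
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        omega = perturbation(k, x) if perturbation is not None else np.zeros_like(x)
        d = s + omega                        # perturbed search direction
        # Backtracking Armijo line search (illustrative stand-in for the
        # paper's line search; capped so it terminates even for poor directions).
        alpha, c, rho = 1.0, 1e-4, 0.5
        for _ in range(30):
            if f(x + alpha * d) <= f(x) + c * alpha * g.dot(d):
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Dai-Yuan parameter: beta_k = ||g_{k+1}||^2 / (s_k^T y_k)
        beta = g_new.dot(g_new) / max(s.dot(y), 1e-12)
        s = -g_new + beta * s                # next main direction
        x, g = x_new, g_new
    return x

# Example: minimize a convex quadratic with small random perturbations.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
rng = np.random.default_rng(0)
x_star = dy_cg_with_perturbations(
    f, grad, np.array([5.0, -3.0]),
    perturbation=lambda k, x: 1e-3 * rng.standard_normal(2))
```

The safeguard max(s.dot(y), 1e-12) only protects this sketch against division by zero; the convergence analysis in the paper relies on its own line search conditions rather than the Armijo rule used here.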
Similar Articles
Two Settings of the Dai-Liao Parameter Based on Modified Secant Equations
Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007), using two approaches that employ an extended conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...
A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization
A modification of the Dai-Yuan conjugate gradient algorithm is proposed. Using the exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. For inexact line search the algorithm satisfies both the sufficient descent and conjugacy condition. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
Another nonlinear conjugate gradient algorithm for unconstrained optimization
A nonlinear conjugate gradient algorithm which is a modification of the Dai and Yuan [Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., 10 (1999), pp. 177-182] conjugate gradient algorithm, satisfying a parametrized sufficient descent condition with a parameter δ_k, is proposed. The parameter δ_k is computed by means of the conj...
New Accelerated Conjugate Gradient Algorithms for Unconstrained Optimization
New accelerated nonlinear conjugate gradient algorithms, which are mainly modifications of Dai and Yuan's method for unconstrained optimization, are proposed. Using the exact line search, the algorithm reduces to the Dai and Yuan conjugate gradient computational scheme. For inexact line search the algorithm satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algo...
A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. However, the strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally provided the line search satisfies the standard Wolfe conditions. The condit...
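For reference, the Dai-Yuan method discussed in these abstracts generates search directions by the standard update below; this is the textbook form of the DY parameter (with g_k = ∇f(x_k) and d_k the search direction), not a restatement of any paper's analysis.

```latex
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{DY} d_k, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top}\,(g_{k+1} - g_k)} .
```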
Journal: J. Systems Science & Complexity
Volume 20, Issue -
Pages: -
Publication date: 2007